A risk-unbiased approach to a new Cramér-Rao bound
Authors
Abstract
How accurately can one estimate a deterministic parameter in the presence of other unknown deterministic model parameters? The most popular answer to this question is given by the Cramér-Rao bound (CRB). The main assumption behind the derivation of the CRB is locally unbiased estimation of all model parameters. This work questions that assumption. Each parameter in turn is treated as the single parameter of interest, while the remaining model parameters are treated as nuisance parameters, since imperfect knowledge of them interferes with the estimation of the parameter of interest. Accordingly, a new Cramér-Rao-type bound on the mean squared error (MSE) of non-Bayesian estimators is established with no unbiasedness condition on the nuisance parameters. Instead, Lehmann's concept of unbiasedness is imposed for a risk that measures the distance between the estimator and the locally best unbiased (LBU) estimator, which assumes perfect knowledge of the nuisance parameters. The proposed bound is compared with the CRB and with the MSE of the maximum likelihood estimator (MLE). Simulations show that the proposed bound is tighter than the CRB, providing a tight lower bound on the MSE of this estimator.
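The comparison between a CRB and the MSE of the MLE described above can be illustrated with a minimal sketch. This is not the paper's new bound, only the classical CRB for the mean of a Gaussian whose variance plays the role of a nuisance parameter, checked against the Monte Carlo MSE of the MLE (the sample mean). All numerical values below are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's proposed bound): classical CRB for the
# mean mu of N(mu, sigma^2) with sigma^2 treated as a nuisance parameter,
# compared with the Monte Carlo MSE of the MLE of the mean.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 1.0, 2.0, 50, 20_000   # assumed values

# Per-sample Fisher information matrix for (mu, sigma^2) of a Gaussian is
# diagonal: diag(1/sigma^2, 1/(2 sigma^4)). The CRB for mu is the [0, 0]
# entry of (n * FIM)^{-1}, which equals sigma^2 / n.
fim = np.diag([1.0 / sigma**2, 1.0 / (2.0 * sigma**4)])
crb_mu = np.linalg.inv(n * fim)[0, 0]

samples = rng.normal(mu, sigma, size=(trials, n))
mle_mu = samples.mean(axis=1)                  # MLE of the mean
mse_mu = np.mean((mle_mu - mu) ** 2)

print(f"CRB for mu : {crb_mu:.4f}")
print(f"MSE of MLE : {mse_mu:.4f}")            # close to the CRB: the sample
                                               # mean is unbiased and efficient
```

Because the FIM is diagonal here, not knowing the nuisance variance costs nothing for estimating the mean; in coupled models the off-diagonal terms inflate the bound, which is the regime the paper's risk-unbiased analysis targets.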
Similar papers
When the Cramér-Rao Inequality Provides No Information
We investigate a one-parameter family of probability densities (related to the Pareto distribution, which describes many natural phenomena) where the Cramér-Rao inequality provides no information. One of the most important problems in statistics is estimating a population parameter from a finite sample. As there are often many different estimators, it is desirable to be...
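For contrast with the degenerate family studied in that paper, the Cramér-Rao inequality is informative for the ordinary Pareto shape parameter, and this can be sketched numerically. The parameter values below are assumptions; the closed forms used (per-sample Fisher information 1/a², MLE a_hat = n / Σ log x_i) are the standard ones for the Pareto density f(x) = a / x^(a+1), x ≥ 1.

```python
import numpy as np

# Hedged sketch (assumed values): for the standard Pareto density with shape a,
# the per-sample Fisher information is I(a) = 1/a^2, so the CRB for a from n
# i.i.d. samples is a^2 / n. Here the bound is informative; the cited paper
# studies a related family where it is not.
rng = np.random.default_rng(1)
a, n, trials = 3.0, 200, 20_000

crb_a = a**2 / n

# numpy's pareto() draws from the Lomax form; adding 1 gives the classic
# Pareto distribution supported on [1, infinity).
x = rng.pareto(a, size=(trials, n)) + 1.0
mle_a = n / np.log(x).sum(axis=1)             # MLE of the shape parameter
mse_a = np.mean((mle_a - a) ** 2)

print(f"CRB for a  : {crb_a:.4f}")
print(f"MSE of MLE : {mse_a:.4f}")            # approaches the CRB as n grows
```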
New Cramér-Rao-Type Bound for Constrained Parameter Estimation
Non-Bayesian parameter estimation under parametric constraints is encountered in numerous applications in signal processing, communications, and control. Mean-squared-error (MSE) lower bounds are widely used as performance benchmarks and for system design. The well-known constrained Cramér-Rao bound (CCRB) is a lower bound on the MSE of estimators that satisfy some unbiasedness conditions. In m...
Information Inequality for Estimation of Transfer Functions: Main Results
In this paper we derive a canonical lower bound for the autocovariance function of any unbiased transfer-function estimator. As a generalization of the Cramér-Rao bound, the Cramér-Rao kernel that we define can be derived without parametrizing the model set. The Cramér-Rao kernel is thus one of the cornerstones for experiment design formulations that do not depend on the choice of coordinates.
Covariance, Subspace, and Intrinsic Cramér-Rao Bounds
Cramér-Rao bounds on estimation accuracy are established for estimation problems on arbitrary manifolds in which no set of intrinsic coordinates exists. The frequently encountered examples of estimating either an unknown subspace or a covariance matrix are examined in detail. The set of subspaces, called the Grassmann manifold, and the set of covariance (positive-definite Hermitian) matrices ha...
Rethinking Biased Estimation: Improving Maximum Likelihood and the Cramér-Rao Bound
One of the prime goals of statistical estimation theory is the development of performance bounds when estimating parameters of interest in a given model, as well as constructing estimators that achieve these limits. When the parameters to be estimated are deterministic, a popular approach is to bound the mean-squared error (MSE) achievable within the class of unbiased estimators. Although it is...